
    Risk assessment for slope monitoring

    One main goal of geodetic deformation monitoring and analysis is to minimize the risk of unexpected collapses of artificial objects and of geologic hazards. At present, the methodology in applied geodesy and its mathematically founded decisions are usually based on probabilities and significance levels, but not on the risk (consequences or costs) itself. In this study, a new concept based on utility theory is introduced into the current methodology. It allows consequences or costs to be considered in geodetic decision making in order to meet the real requirements. Possible decisions are evaluated with cost functions for type I and type II errors, and the decision leading to the minimum costs or consequences is chosen as the most beneficial one. This procedure also allows identifying the most beneficial additional measurements to reduce the risk of an individual monitoring process. In the last part, the theoretical concept is applied to an example in slope monitoring. (DFG/NE 1453/3-)
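    As a hedged illustration (the notation below is mine, not necessarily the paper's): with posterior probabilities P(H0|y) and P(HA|y) for the "stable" and "deformed" states and cost values c_I, c_II for type I and type II errors, a minimum-cost decision rule can be written as

    \[
    \mathbb{E}[C \mid \text{accept } H_0] = c_{\mathrm{II}}\,P(H_A \mid y), \qquad
    \mathbb{E}[C \mid \text{reject } H_0] = c_{\mathrm{I}}\,P(H_0 \mid y),
    \]
    \[
    \text{reject } H_0 \quad \Longleftrightarrow \quad c_{\mathrm{I}}\,P(H_0 \mid y) \;<\; c_{\mathrm{II}}\,P(H_A \mid y).
    \]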

    Utility theory as a method to minimise the risk in deformation analysis decisions

    Deformation monitoring usually focuses on detecting whether the monitored object satisfies the given properties (e.g. being stable or not) and on making further decisions that minimise the risks, for example the consequences and costs in case of collapse of artificial objects and/or natural hazards. With this intention, a methodology relying on hypothesis testing and utility theory is reviewed in this paper. The main idea of utility theory is to judge each possible outcome with a utility value. The presented methodology makes it possible to minimise the risk of an individual monitoring project by considering the costs and consequences of all possible situations within the decision process. The danger that the monitored object may collapse cannot itself be reduced; however, the risk (the utility values multiplied by the danger) can be described more appropriately, and therefore more valuable decisions can be made. In particular, the opportunity to minimise the risk via the measurement process is a key issue. In this paper, the application of the methodology to two classical cases in hypothesis testing is discussed in detail: 1) the probability density functions (pdfs) of the tested quantity under both the null and the alternative hypothesis are known; 2) only the pdf under the null hypothesis is known and the alternative hypothesis is treated as the pure negation of the null hypothesis. Afterwards, a practical example in deformation monitoring is introduced and analysed. Additionally, the way in which the magnitudes of the utility values (the consequences of a decision) influence the decision is considered and discussed at the end.
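    The following Python sketch illustrates case 1 with assumed numbers (the priors, costs and normal pdfs are illustrative only, not taken from the paper): the decision with the smaller expected cost is chosen.

```python
# Minimal sketch (not the authors' code): expected-cost decision for case 1,
# where the pdf of the test quantity is known under both hypotheses.
# All priors, costs and distribution parameters below are illustrative.
from scipy.stats import norm

def min_cost_decision(y, prior_h0=0.9, cost_type1=1.0, cost_type2=50.0,
                      sigma=1.0, shift=3.0):
    """Decide 'stable' (accept H0) or 'deformed' (reject H0) for observation y."""
    prior_ha = 1.0 - prior_h0
    # Likelihoods under H0 (no deformation) and HA (deformation of size `shift`)
    like_h0 = norm.pdf(y, loc=0.0, scale=sigma)
    like_ha = norm.pdf(y, loc=shift, scale=sigma)
    evidence = prior_h0 * like_h0 + prior_ha * like_ha
    post_h0 = prior_h0 * like_h0 / evidence
    post_ha = prior_ha * like_ha / evidence
    # Expected cost of each decision: false alarm (type I) vs. missed detection (type II)
    cost_reject = cost_type1 * post_h0   # declare deformation although H0 is true
    cost_accept = cost_type2 * post_ha   # declare stability although HA is true
    return ("reject H0 (deformed)", cost_reject) if cost_reject < cost_accept \
        else ("accept H0 (stable)", cost_accept)

print(min_cost_decision(y=1.8))
```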

    Finite element analysis based on a parametric model by approximating point clouds

    Simplified models are widely applied in finite element computations for mechanical and structural problems. However, a simplified model often causes considerable deviations in the finite element analysis (FEA) of structures, especially for non-designed structures that have undergone unknown deformations. Hence, a novel FEA methodology based on a parametric model obtained by approximating three-dimensional (3D) feature data is proposed in this manuscript to solve this problem. Many significant and effective technologies have been developed to detect 3D feature information accurately, e.g. terrestrial laser scanning (TLS), digital photogrammetry, and radar technology. In this manuscript, the parametric FEA model combines 3D point clouds from TLS with a parametric surface approximation method to generate accurate 3D surfaces and models. TLS is a popular measurement method for the reliable acquisition of 3D point clouds and for monitoring deformations of structures with high accuracy and precision. The B-spline method is applied to approximate the measured point cloud data automatically and to generate an accurate parametric description of the structure. The final target is to reduce the effects of the model description and the deviations of the FEA. Both static and dynamic computations of a composite structure are carried out, comparing the parametric and the general simplified model. The deformations and equivalent stresses predicted for future behaviour by the different models are compared. The results indicate that the parametric model based on the TLS data is superior in the finite element computation. Therefore, it is of great significance to apply the parametric model in the FEA to accurately compute and predict the future behaviour of structures with unknown deformations in engineering.
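    A minimal sketch of the surface-approximation step, assuming SciPy's generic smoothing B-spline routines rather than the authors' implementation; the synthetic points stand in for a TLS scan:

```python
# Minimal sketch (assumption, not the paper's implementation): approximate a
# TLS-like point cloud z(x, y) with a bicubic B-spline surface using SciPy,
# which can then serve as a parametric geometry for an FEA preprocessor.
import numpy as np
from scipy.interpolate import bisplrep, bisplev

rng = np.random.default_rng(0)
# Synthetic "scanned" points of a gently curved plate with measurement noise
x = rng.uniform(0.0, 2.0, 2000)
y = rng.uniform(0.0, 1.0, 2000)
z = 0.05 * np.sin(np.pi * x) * np.cos(np.pi * y) + rng.normal(0, 0.001, x.size)

# Smoothing B-spline surface (kx = ky = 3); s controls the approximation error
tck = bisplrep(x, y, z, kx=3, ky=3, s=x.size * 0.001**2)

# Evaluate the parametric surface on a regular grid, e.g. to mesh it for FEA
xg = np.linspace(0.0, 2.0, 50)
yg = np.linspace(0.0, 1.0, 25)
zg = bisplev(xg, yg, tck)
print(zg.shape)  # (50, 25) grid of surface heights
```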

    The benefit of 3D laser scanning technology in the generation and calibration of FEM models for health assessment of concrete structures

    Terrestrial laser scanning (TLS) is a comparatively new technique for rapidly acquiring three-dimensional information. In this paper we investigate the health assessment of concrete structures with a Finite Element Method (FEM) model based on TLS. The goal is to exploit the benefits of 3D TLS in the generation and calibration of FEM models, in order to build a convenient, efficient and intelligent model which can be widely used for the detection and assessment of bridges, buildings, subways and other objects. After comparing the finite element simulation with surface-based measurement data from TLS, the FEM model is found to be acceptable, with an error of less than 5%. The benefit of TLS lies mainly in the possibility of a surface-based validation of the results predicted by the FEM model.
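    A hedged sketch of such a surface-based validation check, with purely illustrative numbers; the 5% acceptance threshold is the one quoted above:

```python
# Minimal sketch (assumption, not the authors' workflow): validate an FEM
# prediction against TLS-derived surface displacements via the relative
# deviation at common evaluation points; the 5% threshold follows the abstract.
import numpy as np

def relative_deviation(fem_deflection, tls_deflection):
    """Maximum deviation between two displacement fields, relative to the TLS peak."""
    fem = np.asarray(fem_deflection, dtype=float)
    tls = np.asarray(tls_deflection, dtype=float)
    return np.max(np.abs(fem - tls)) / np.max(np.abs(tls))

# Illustrative numbers only (mm): FEM nodal deflections vs. TLS surface values
fem = [0.00, -1.02, -1.95, -2.51, -1.98, -1.01, 0.00]
tls = [0.01, -1.00, -1.99, -2.45, -2.02, -0.99, 0.02]
dev = relative_deviation(fem, tls)
print(f"max. relative deviation: {dev:.1%}",
      "-> acceptable" if dev < 0.05 else "-> recalibrate the FEM model")
```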

    Uncertainty modeling of random and systematic errors by means of Monte Carlo and fuzzy techniques

    The standard reference in uncertainty modeling is the “Guide to the Expression of Uncertainty in Measurement (GUM)”. The GUM groups the occurring uncertain quantities into “Type A” and “Type B”. Uncertainties of “Type A” are determined with classical statistical methods, while “Type B” comprises the remaining uncertainties, which are obtained from experience and knowledge about an instrument or a measurement process. Both types of uncertainty can have random and systematic error components. Our study focuses on a detailed comparison of probabilistic and fuzzy-random approaches for handling and propagating the different uncertainties, especially those of “Type B”. Whereas a probabilistic approach treats all uncertainties as having a random nature, the fuzzy technique distinguishes between random and deterministic errors. In the fuzzy-random approach the random components are modeled in a stochastic framework, and the deterministic uncertainties are treated by means of a range-of-values search problem. The applied procedure is outlined, showing both the theory and a numerical example for the evaluation of uncertainties in an application of terrestrial laser scanning (TLS). (DFG/KU/1250/4-)
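    A minimal sketch of the combined propagation, assuming a simplified TLS distance model and bounds chosen for illustration only: random components are sampled by Monte Carlo, while the systematic components are swept over their value ranges.

```python
# Minimal sketch (assumption, not the paper's algorithm): propagate random
# components by Monte Carlo sampling and systematic (deterministic) components
# by a range-of-values search over their bounds, in a fuzzy/interval spirit.
import itertools
import numpy as np

def measurement_model(distance, scale_error, zero_offset):
    """Illustrative TLS-like model: observed distance corrected for systematics."""
    return distance * (1.0 + scale_error) + zero_offset

rng = np.random.default_rng(1)
n_samples = 20000
# Random ("Type A"-like) component: distance noise with 2 mm standard deviation
distance = 50.0 + rng.normal(0.0, 0.002, n_samples)

# Systematic ("Type B"-like) components: only value ranges are assumed known
scale_bounds = (-5e-6, 5e-6)      # scale error interval
offset_bounds = (-0.001, 0.001)   # zero offset interval (m)

# Range-of-values search: for this monotone model the extremes occur at the
# corners of the systematic intervals, so evaluating the corners is sufficient.
results = [measurement_model(distance, s, o)
           for s, o in itertools.product(scale_bounds, offset_bounds)]
lower = min(r.mean() - 2 * r.std() for r in results)
upper = max(r.mean() + 2 * r.std() for r in results)
print(f"combined uncertainty interval: [{lower:.4f} m, {upper:.4f} m]")
```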

    Robust Spatial Approximation of Laser Scanner Point Clouds by Means of Free-form Curve Approaches in Deformation Analysis

    In many geodetic engineering applications it is necessary to describe a measured point cloud, acquired e.g. by a laser scanner, by means of free-form curves or surfaces, e.g. with B-splines as basis functions. State-of-the-art approaches for determining B-splines yield results that are strongly distorted by data gaps and outliers. Optimal and robust B-spline fitting depends, however, on an optimal selection of the knot vector. Hence, our approach combines Monte Carlo methods with the location and curvature of the measured data in order to determine the knot vector of the B-spline in such a way that no oscillating effects occur at the edges of data gaps. We introduce an optimized approach based on weights computed by means of resampling techniques. In order to minimize the effect of outliers, we apply robust M-estimators for the estimation of the control points. The approach is applied to a multi-sensor system based on kinematic terrestrial laser scanning in the field of rail track inspection.
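    A hedged sketch of the robust estimation step (not the authors' estimator): the control points are re-estimated by iteratively reweighted least squares with Huber weights on a fixed, illustrative knot vector.

```python
# Minimal sketch (assumption, not the authors' estimator): robust B-spline
# curve fitting by iteratively reweighted least squares with Huber weights,
# so that gross outliers in the scanned profile are down-weighted.
import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 400)
y = np.sin(x) + rng.normal(0.0, 0.02, x.size)
y[::40] += rng.normal(0.0, 0.5, y[::40].size)    # gross outliers

k = 3                                            # cubic B-spline
t_int = np.linspace(0.5, 9.5, 12)                # interior knots (illustrative)
t = np.r_[[x[0]] * (k + 1), t_int, [x[-1]] * (k + 1)]

w = np.ones_like(x)
for _ in range(10):                              # IRLS iterations
    spl = make_lsq_spline(x, y, t, k, w=w)       # weighted LSQ control points
    r = y - spl(x)
    s = 1.4826 * np.median(np.abs(r))            # robust scale (MAD)
    c = 1.345 * s                                # Huber tuning constant
    w = np.where(np.abs(r) <= c, 1.0, c / np.abs(r))

inliers = np.abs(r) <= c
print("robust RMS of inliers:", np.sqrt(np.mean((y - spl(x))[inliers] ** 2)))
```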

    Terrestrial laser scanning technology for deformation monitoring and surface modeling of arch structures

    Terrestrial laser scanning (TLS) can serve as a reliable, high-precision deformation monitoring technique for concrete composite structures. TLS measurements of an arch structure under monotonic loading are carried out. In this paper, a comparison between the original and an optimized extraction of the point clouds is presented. A surface approximation is implemented which also covers areas without measurements, and the uncertainties of surfaces of different order are investigated. The results of the surface approximation based on the TLS measurements are related to the surface roughness of the specimen, which is eliminated by subtraction in the deformation calculation. (DFG; Natural Science Foundation of Jiangsu Province)
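    A minimal sketch of surface approximation and epoch differencing under assumptions of my own (simple polynomial surfaces and synthetic data, not the paper's processing chain): the roughness common to both epochs largely cancels in the difference.

```python
# Minimal sketch (assumption, not the paper's processing chain): least-squares
# polynomial surface approximation of a scanned patch and deformation by
# differencing two epochs, which cancels the epoch-independent roughness part.
import numpy as np

def poly_design(x, y, order):
    """Design matrix of a bivariate polynomial surface of total degree <= order."""
    return np.column_stack([x**i * y**j
                            for i in range(order + 1)
                            for j in range(order + 1 - i)])

def fit_surface(x, y, z, order):
    A = poly_design(x, y, order)
    coeff, *_ = np.linalg.lstsq(A, z, rcond=None)
    return lambda xq, yq: poly_design(xq, yq, order) @ coeff

rng = np.random.default_rng(3)
x, y = rng.uniform(0, 1, (2, 3000))
roughness = 0.0005 * np.sin(40 * x) * np.sin(40 * y)       # specimen texture
epoch0 = 0.01 * x * y + roughness + rng.normal(0, 1e-4, x.size)
epoch1 = epoch0 - 0.002 * x**2                             # loading-induced deflection

s0 = fit_surface(x, y, epoch0, order=3)
s1 = fit_surface(x, y, epoch1, order=3)
deformation = s1(x, y) - s0(x, y)                          # roughness cancels
print("peak estimated deflection (signed) [m]:", deformation.min())
```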

    Wavelength-converted light sources in fluorescence-based methods in medical technology

    This contribution proposes phosphors as excitation sources for fluorescence analysis and evaluates their potential in this application area. The aim of this research is to provide a method that quantifies how well a phosphor fits as excitation source in a given optical system with spectral filters for fluorescence analysis. The approach consists of a mathematical calculation of crosstalk, which is first applied to abstract and subsequently to defined phosphor emission spectra. The resulting crosstalk is used as a measure indicating the suitability of a phosphor spectrum. The result of this contribution is a detailed description of the applied method as well as an example exercise on a given optical system, which gives an impression of the possibilities phosphors offer in this application. The presented method is applicable to any (new) phosphor or even to LED spectra. In particular, the evaluations on the example optical system allow conclusions that help to design future optical systems.
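    The crosstalk measure sketched below is an assumption of mine for illustration, not necessarily the authors' definition: it relates the phosphor light leaking through the emission filter to the light available for exciting the fluorophore through the excitation filter.

```python
# Minimal sketch (assumed crosstalk measure, not necessarily the authors'):
# ratio of excitation (phosphor) light leaking through the emission filter
# to the light passing the excitation filter, via spectral overlap integrals.
import numpy as np

wl = np.linspace(350.0, 750.0, 2001)                       # wavelength grid [nm]

def gaussian(wl, center, fwhm):
    sigma = fwhm / 2.3548
    return np.exp(-0.5 * ((wl - center) / sigma) ** 2)

def bandpass(wl, lo, hi):
    return ((wl >= lo) & (wl <= hi)).astype(float)

phosphor = gaussian(wl, 480.0, 60.0)                        # broad phosphor emission
excitation_filter = bandpass(wl, 460.0, 500.0)
emission_filter = bandpass(wl, 520.0, 560.0)

usable = np.trapz(phosphor * excitation_filter, wl)         # excites the fluorophore
leakage = np.trapz(phosphor * emission_filter, wl)          # reaches the detector
print(f"crosstalk ratio: {leakage / usable:.2f}")           # np.trapezoid in NumPy >= 2.0
```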

    Using Machine-Learning for the Damage Detection of Harbour Structures

    The ageing infrastructure in ports requires regular inspection. This inspection is currently carried out manually by divers who sense the entire below-water infrastructure by hand. This process is cost-intensive, as it involves a lot of time and human resources. To overcome these difficulties, we propose scanning the above and below-water port structure with a multi-sensor system and classifying the resulting point cloud into damaged and undamaged zones in a fully automated process. We use simulated training data to test our approach, because not enough training data with corresponding class labels are available yet. Accordingly, we build a rasterised height field of a point cloud of a sheet pile wall by subtracting a computer-aided design model. This height field is propagated through a convolutional neural network, which detects anomalies. We make use of two methods: the VGG19 deep neural network and local outlier factors. We show that our approach can achieve fully automated, reproducible, quality-controlled damage detection that analyses the whole structure instead of the sample-wise manual inspection by divers. We achieved valuable results for our application: the accuracy of the proposed method is 98.8% at a desired recall of 95%. The proposed strategy is also applicable to other infrastructure objects, such as bridges and high-rise buildings.
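    A hedged sketch of the anomaly-detection idea with the two mentioned ingredients (VGG19 features and a Local Outlier Factor); the data, patch size and parameters are illustrative only, not the authors' pipeline.

```python
# Minimal sketch (assumption, not the authors' pipeline): score patches of a
# rasterised height field (point cloud minus CAD model) as anomalies by
# extracting VGG19 features and applying a Local Outlier Factor.
import numpy as np
import torch
from torchvision.models import vgg19, VGG19_Weights
from sklearn.neighbors import LocalOutlierFactor

# Illustrative data: 32 "undamaged" training patches and 8 test patches of
# height residuals, scaled to [0, 1] and replicated to 3 channels for VGG19.
rng = np.random.default_rng(4)
train = rng.normal(0.5, 0.02, (32, 224, 224)).astype(np.float32)
test = rng.normal(0.5, 0.02, (8, 224, 224)).astype(np.float32)
test[0, 80:120, 80:120] += 0.4            # synthetic "damage" bump in one patch

backbone = vgg19(weights=VGG19_Weights.IMAGENET1K_V1).features.eval()

def vgg_features(patches):
    x = torch.from_numpy(patches)[:, None].repeat(1, 3, 1, 1)   # N x 3 x H x W
    with torch.no_grad():
        fmap = backbone(x)                                      # N x 512 x 7 x 7
    return fmap.mean(dim=(2, 3)).numpy()                        # global average pooling

lof = LocalOutlierFactor(n_neighbors=10, novelty=True).fit(vgg_features(train))
labels = lof.predict(vgg_features(test))                        # +1 normal, -1 anomaly
print("damaged patch indices:", np.where(labels == -1)[0])
```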